Can 'Social Determinants' Data Really Improve Patient Care?

— Population-level research supports the concept, but benefit for individual patients less clear

Last Updated August 8, 2018
MedpageToday

Fictional TV doctor Gregory House routinely ordered his team to break into patients' homes to find clues to their mystery ailments, perhaps revealing behaviors their patients failed to share.

"Everybody lies," House often said.

In what some suggest is another ethically questionable version of medical sleuthing, companies are now scanning public records that provide clues to individuals' "social determinants of health," or SDOH -- such as arrest records, bankruptcy filings, voter registration, address changes, and marriages and divorces -- that, in combination with traditional prognostic tools, may predict an individual's likelihood of future healthcare needs and costs.

Patients' medical history and what they tell their physicians are no longer enough -- or that's the pitch, anyway, from companies that gather these SDOH data.

They are packaging the data in algorithms for sale to insurers and health systems, offering a window into the social factors and behaviors that put roadblocks in patients' path to good health. The idea is that providers, pairing these insights with clinical data, can then tailor care plans to clear that path. And, of course, lower costs.

Asked how a physician might use this information, Anton Berisha, MD, senior director of clinical analytics for LexisNexis Risk Solutions, one of the larger companies marketing these tools, gave examples.

Records of "legal encounters," for instance, could alert a provider to bring up financial assistance or support networks, he said.

'Red Dot' Patients

SDOH data could put a colored dot on a patient's chart, with red indicating a low-income socioeconomic environment, Berisha said. For such a patient, "I'm going to probably think of writing a prescription for a generic drug that costs $4 at Walmart rather than slapping a member with a $300 or $400 brand-name drug." Berisha added that the physician would have to ensure that the generic, or a similar drug, is close enough to the brand-name version to provide an acceptably equivalent benefit.

Or, the physician might hand a patient with transportation problems a prescription for a 90-day supply, instead of the customary 30 days -- thus reducing the number of trips to the pharmacy, and the chances that the patient would end up non-adherent.

LexisNexis says it uses up to 442 separate types of data on as many as 279 million identities, including bankruptcy, lien and judgment records, motor vehicle and driving history, accident reports, certain major purchases, Social Security numbers, email addresses, and much more.

Its proprietary (read: secret) models use the data to generate two risk scores, he said: one projecting the patient's total cost over the next 12 months, the other estimating the risk of 30-day readmission after hospital discharge.

Specific information about a patient isn't revealed, Berisha emphasized: the clinician doesn't learn of an arrest, for example, only that risk factors exist. Nor do the algorithms contain credit ratings, bank balances, social media history, travel information, data on relatives, or a patient's net worth, he said.
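LexisNexis has not disclosed how its scores are computed. Purely as illustration, here is a minimal sketch of how a generic 30-day readmission score might be assembled with a simple logistic regression; every feature name and weight below is invented for the example, not drawn from any vendor's actual model:

    import math

    # Hypothetical SDOH inputs for one patient. These feature names are
    # invented for illustration; they are not LexisNexis fields.
    features = {
        "address_changes_past_2y": 3,     # residential instability
        "has_bankruptcy_filing": 1,       # financial stress (1 = yes)
        "owns_vehicle": 0,                # transportation access (0 = no)
        "distance_to_pharmacy_miles": 12, # geographic barrier
    }

    # Invented coefficients standing in for weights a real model would
    # learn from historical claims and outcomes data.
    weights = {
        "address_changes_past_2y": 0.30,
        "has_bankruptcy_filing": 0.55,
        "owns_vehicle": -0.40,
        "distance_to_pharmacy_miles": 0.05,
    }
    intercept = -2.0  # baseline log-odds of readmission

    # Logistic regression: risk = sigmoid(intercept + weights . features)
    log_odds = intercept + sum(weights[k] * v for k, v in features.items())
    risk = 1 / (1 + math.exp(-log_odds))

    print(f"Estimated 30-day readmission risk: {risk:.1%}")

A production system would fit those weights to outcome data and validate them; the sketch shows only the shape of the computation that turns public-records signals into a single score.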

Help for the Food Insecure

Trenor Williams, MD, is founder and CEO of Socially Determined in Washington, D.C., which acquires datasets like these and helps health systems and other groups, such as accountable care organizations, design population-level interventions that reduce the cost of care, and thus risk. He sees great potential.

"We might implement a program around the diabetic that also has food insecurity and transportation risk. Maybe you implement food delivery," he said.

Rather than waiting for a patient with poorly controlled diabetes to make an appointment, the provider team would proactively reach out to that patient to schedule a visit before they normally would.

Practically speaking, care coordinators or social workers would take on these added tasks for the physician.

Williams described a specific use of the SDOH data: his organization works with ride-sharing companies such as Lyft to coordinate non-emergency transportation services for patients who, the risk scores indicate, are likely to have difficulty getting to the doctor's office. The programs are especially effective for pregnant women who need office visits for prenatal care to prevent complications that could result in expensive stays in a neonatal intensive care unit, he said.

Peter Long, president and CEO of Blue Shield of California Foundation, sees the use of "big data" on social determinants as another facet of precision medicine, tailored to a patient's non-health needs rather than their genetic code.

As chair of last year's National Academy of Medicine report, Effective Care for High-Need Patients, he's excited about the data's potential to improve outcomes. "It's very evident to me that the health care system has had a Rip Van Winkle moment" about what the data is and how it could be used, he said.

But providers don't agree on what it is or whether it's a good thing, and evidence that it actually works is sparse, he acknowledged. A non-profit that wasn't fully vetted could, for example, deliver inappropriate meals to a patient with a food allergy, Long said.

"When things just get reduced to an algorithm, and we're making decisions that affect peoples' well-being, we don't always get it right. And who's going to accept responsibility for that if it's wrong?"

No Panacea

Indeed, some clinicians and health policy officials say this trend raises troublesome ethical and privacy issues and could interfere with the trusted physician-patient relationship, especially since patients are not usually informed that their non-health data might be used to recommend care.

To date, it's unclear whether patients are given an opportunity to opt in or out, and many sources believe they are not.

While having this information might be useful, "it could actually set back the cause of looking into social determinants of health," said Nancy Adler of the University of California San Francisco, co-chair of a 2014 National Academy of Medicine report on a related topic. "Patients will start to worry that very sensitive information is getting into their electronic health record."

Ashish Jha, MD, director of the Harvard Global Health Institute, who has served on committees examining this topic, has concerns. As a practical matter, there's no evidence these algorithms actually identify riskier populations, he said, and little evidence on what interventions would change the course for these patients.

"Show me actual evidence that what you've done has made any kind of difference," Jha said. "Until then, I'm a skeptic."

Jha added, "There's a long history of people building algorithms like this and most of them have not done much. Besides, most doctors already know when their patients are in trouble. They may not have known the patient had a DUI, for example, but they might have known the patient had issues with substance abuse."

"Sure, I can point to anecdotes where this computer system told me something I didn't know and that changed the course. But I'm not aware of the data that shows these things [algorithms] really do that," Jha said.

Berisha of LexisNexis countered that while some may think doctors should know which of their patients have special needs, this kind of "clinically validated data was not readily available to health care providers or health plans until now." Doctors can now know much more about their patients.

Karen Joynt Maddox, MD, of Washington University in St. Louis, was also skeptical of SDOH data for guiding individual patient care.

"Here's the rub," she said. "We don't have perfect data on what to do once we've determined that people are high risk."

Some trials of interventions such as transportation and housing assistance have shown promise in improving medication adherence, "though not as much as people hoped," she said. Overall, the evidence points to beefing up care coordination with such efforts as more frequent home visits or diabetes education.

And providers are already doing that.

Slippery Slope?

Some patient advocates are uneasy. Casey Quinlan, a particularly outspoken breast cancer survivor -- who sports a QR code linking to her EHR tattooed on her chest as a statement -- drew the line.

"They're right to look at social determinants but, holy hat," she said. "It's much better to have a conversation with [the patient] than to suck up a whole bunch of data about them, and then make a conclusion based on what you see on a dashboard, as opposed to what you actually know about them.... To me, this is one of the most slippery of slopes."

She and others worry that the tactic may have an underlying motive: giving insurance companies information that allows them eventually to discriminate in where they offer coverage, or to set higher prices.

Deven McGraw, JD, MPH, former deputy director for health information privacy for the U.S. Health and Human Services Office for Civil Rights, who now works with the California startup Ciitizen, said that on its face, evaluating risk with non-health data might be a good thing.

"Particularly when social and economic determinants, like stress levels of life, poverty, the presence of a gun in the home, tend to have very high correlations to health status" it makes sense, she said.

But, she added, there's "the scary part": the possibility that information on high-cost, high-risk patients will be used by health plans and others to determine pricing or limit coverage areas. That's especially worrisome in light of discussions underway that threaten to remove the Affordable Care Act's prohibition against preexisting-condition exclusions, she said.

Berisha denied that is the intent.

But there are other, subtler ways health providers and insurers could use the information to impede access for higher-risk residents, McGraw said. "The provider might be thinking of where to put a new or expanded office and decides not to go into a certain zip code because they can see the likelihood they can operate without being significantly in the red all the time is pretty low."

McGraw also said such data mining raises "pretty significant ethical dilemmas that HIPAA [Health Insurance Portability and Accountability Act] doesn't really address," chief among them whether non-health information, when combined with a patient's medical history, becomes protected health information. That needs to be clarified, she said.